Adversarial Learning with Bayesian Hierarchical Mixtures of Experts
Authors
Abstract
Many data mining applications operate in adversarial environments; webpage ranking in the presence of web spam is one example. A growing number of adversarial data mining techniques have recently been developed, providing robust solutions under specific defense–attack models. Existing techniques are tied to distributional assumptions geared towards minimizing the undesirable impact of given attack models. However, the large variety of attack strategies renders the adversarial learning problem multimodal, calling for a more flexible modeling ideology for equivocal input. In this paper we present a Bayesian hierarchical mixtures-of-experts model for adversarial learning. The technique groups data into soft partitions and fits simple function approximators, referred to as "experts", within each. Experts are ranked by gating functions for each input. Ambiguous input is predicted competitively by multiple experts, while unambiguous input is effectively predicted by a single expert. Optimal attacks minimizing the likelihood of malicious data are modeled interactively at both the expert and gating levels of the learning hierarchy. We demonstrate that our adversarial hierarchical-mixtures-of-experts learning model is robust against adversarial attacks on both artificial and real data.
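The core mechanism the abstract describes, gating functions that softly rank experts and combine their predictions per input, can be illustrated with a minimal sketch. This is not the paper's Bayesian hierarchical model or its adversarial training procedure; it is a plain two-expert mixture with a softmax gate and linear experts, with hypothetical parameter names `V` (gating weights) and `W` (expert weights):

```python
import numpy as np

# Minimal mixture-of-experts sketch (illustrative only; the paper's
# Bayesian hierarchical model and adversarial attack modeling are far
# richer). V and W are hypothetical parameter matrices, one row per expert.

def gate(x, V):
    """Softmax gating network: soft ranking of experts for input x."""
    s = V @ np.append(x, 1.0)          # affine score per expert
    e = np.exp(s - s.max())            # numerically stable softmax
    return e / e.sum()

def predict(x, V, W):
    """Gate-weighted combination of linear expert predictions."""
    g = gate(x, V)                     # soft partition responsibilities
    experts = W @ np.append(x, 1.0)    # each expert's own prediction
    return float(g @ experts)

# Two experts: expert 0 models y = 2x + 1, expert 1 models y = -x + 3.
# The gate sends positive inputs to expert 0 and negative ones to expert 1,
# so each region of the input space is handled by its specialist.
V = np.array([[10.0, 0.0], [-10.0, 0.0]])
W = np.array([[2.0, 1.0], [-1.0, 3.0]])
```

For an unambiguous input (far from the gate's decision boundary) one expert receives nearly all the weight; near the boundary both experts contribute, which is the competitive prediction of ambiguous input the abstract mentions.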
Similar resources
Hierarchical Mixtures of Naive Bayesian Classifiers
Naive Bayesian classifiers tend to perform very well on a large number of problem domains, although their representation power is quite limited compared to more sophisticated machine learning algorithms. In this paper we study combining multiple naive Bayesian classifiers by using the hierarchical mixtures of experts system. This novel system, which we call hierarchical mixtures of naive Bayesi...
Bayesian Inference in Mixtures-of-Experts and Hierarchical Mixtures-of-Experts Models With an Application to Speech Recognition
Machine classification of acoustic waveforms as speech events is often difficult due to context-dependencies. A vowel recognition task with multiple speakers is studied in this paper via the use of a class of modular and hierarchical systems referred to as mixtures-of-experts and hierarchical mixtures-of-experts models. The statistical model underlying the systems is a mixture model in which both ...
Improvement of generative adversarial networks for automatic text-to-image generation
This research concerns the use of deep learning tools and image processing technology for the automatic generation of images from text. Previous research has used a single sentence to produce images. In this research, a memory-based hierarchical model is presented that uses three different descriptions, given in the form of sentences, to produce and refine the image. The proposed ...
Fully-Automatic Bayesian Piecewise Sparse Linear Models
Piecewise linear models (PLMs) have been widely used in many enterprise machine learning problems; they assign linear experts to individual partitions of the feature space and express the whole model as patches of local experts. This paper addresses simultaneous model selection issues of PLMs: partition structure determination and feature selection for individual experts. Our contributions are mainly...
Bayesian Normalized Gaussian Network and Hierarchical Model Selection Method
This paper presents a variational Bayes (VB) method for the normalized Gaussian network, a mixture model of local experts. Based on the Bayesian framework, we introduce a meta-learning mechanism to optimize the prior distribution and the model structure. In order to search for the optimal model structure efficiently, we also develop a hierarchical model selection method. The performance of...
Journal:
Volume / Issue
Pages: -
Publication date: 2014